# Text Understanding

**Persian Question Generator** (myrkur) · Apache-2.0 · 57 downloads · 1 like
A fine-tuned mT5 model for generating questions from Persian text; a usage sketch follows.
Tags: Question Answering System, Transformers, Other

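A minimal sketch of driving a question-generation model like this through the `text2text-generation` pipeline. The repo id `myrkur/persian-question-generator` is an assumption based on the card name, not a confirmed id; check the exact path on the Hub.

```python
# Hedged sketch: question generation with a Persian mT5 fine-tune.
# NOTE: the repo id below is assumed from the card name, not confirmed.
from transformers import pipeline

qg = pipeline("text2text-generation", model="myrkur/persian-question-generator")

context = "تهران پایتخت ایران است."  # "Tehran is the capital of Iran."
print(qg(context, max_length=64)[0]["generated_text"])
```
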
**Wasmai 7b V1** (wasmdashai) · 133 downloads · 1 like
A Transformers model hosted on the Hugging Face Hub; its specific functionality and intended use cases are not documented.
Tags: Large Language Model, Transformers

**Spanbert Qa** (vaibhav9) · 24 downloads · 0 likes
A question-answering model fine-tuned from SpanBERT/spanbert-base-cased, suitable for reading-comprehension tasks; a usage sketch follows.
Tags: Question Answering System, Transformers

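A minimal extractive-QA sketch using the `question-answering` pipeline, the standard way to run SQuAD-style fine-tunes like the one above. The repo id `vaibhav9/spanbert-qa` is assumed from the card, not confirmed.

```python
# Hedged sketch: extractive QA with a SpanBERT fine-tune.
# NOTE: the repo id is assumed from the card name.
from transformers import pipeline

qa = pipeline("question-answering", model="vaibhav9/spanbert-qa")
result = qa(
    question="What is SpanBERT designed for?",
    context="SpanBERT is a pre-training method designed to better "
            "represent and predict spans of text.",
)
print(result["answer"], round(result["score"], 3))
```
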
**Rbt4 H312** (hfl) · Apache-2.0 · 34 downloads · 5 likes
MiniRBT is a small Chinese pre-trained model built with knowledge distillation and optimized for training efficiency using Whole Word Masking.
Tags: Large Language Model, Transformers, Chinese

**Minirbt H288** (hfl) · Apache-2.0 · 405 downloads · 8 likes
MiniRBT is a small Chinese pre-trained model built with knowledge distillation and optimized for training efficiency using Whole Word Masking.
Tags: Large Language Model, Transformers, Chinese

**Minirbt H256** (hfl) · Apache-2.0 · 225 downloads · 7 likes
MiniRBT is a small Chinese pre-trained model based on knowledge distillation, combined with Whole Word Masking and suitable for a range of Chinese NLP tasks; a fill-mask sketch follows.
Tags: Large Language Model, Transformers, Chinese

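The three hfl entries above are masked language models, so the `fill-mask` pipeline is the natural smoke test. A minimal sketch, assuming the repo id `hfl/minirbt-h256` matches the card:

```python
# Hedged sketch: Chinese fill-mask with MiniRBT (BERT-style [MASK] token).
from transformers import pipeline

fill = pipeline("fill-mask", model="hfl/minirbt-h256")
for pred in fill("哈尔滨是黑龙江的[MASK]会。"):
    print(pred["token_str"], round(pred["score"], 3))
```
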
**Rubert Tiny Squad** (Den4ikAI) · MIT · 32 downloads · 0 likes
A Russian question-answering model fine-tuned from cointegrated/rubert-tiny2, suitable for SQuAD-format QA tasks.
Tags: Question Answering System, Transformers

**Distilbert Base Uncased Combined Squad Adversarial** (stevemobs) · Apache-2.0 · 15 downloads · 0 likes
A fine-tuned version of distilbert-base-uncased on the SQuAD Adversarial dataset, suitable for question-answering tasks.
Tags: Question Answering System, Transformers

**Opt 6.7b** (facebook) · Other · 72.30k downloads · 116 likes
OPT is an open pre-trained Transformer language model from Meta AI with 6.7B parameters, released to advance research on large-scale language models; a generation sketch follows.
Tags: Large Language Model, English

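A generation sketch for OPT-6.7B. `facebook/opt-6.7b` is the published repo id; loading in float16 with `device_map="auto"` (which requires the `accelerate` package) still needs roughly 13 GB of GPU memory.

```python
# Hedged sketch: causal text generation with OPT-6.7B on a GPU.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tok = AutoTokenizer.from_pretrained("facebook/opt-6.7b")
model = AutoModelForCausalLM.from_pretrained(
    "facebook/opt-6.7b", torch_dtype=torch.float16, device_map="auto"
)

prompt = "Open pre-trained transformer language models are"
inputs = tok(prompt, return_tensors="pt").to(model.device)
out = model.generate(**inputs, max_new_tokens=40, do_sample=True, top_p=0.9)
print(tok.decode(out[0], skip_special_tokens=True))
```
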
**Distilbert Base Uncased Finetuned Squad** (jhoonk) · Apache-2.0 · 15 downloads · 0 likes
A DistilBERT-base model fine-tuned on question-answering datasets, suitable for QA tasks.
Tags: Question Answering System, Transformers

**Simpledataset** (DioLiu) · Apache-2.0 · 174 downloads · 0 likes
A model fine-tuned from distilroberta-base; its intended use and training data are not clearly stated.
Tags: Large Language Model, Transformers

**Erlangshen Roberta 110M Sentiment** (IDEA-CCNL) · Apache-2.0 · 16.19k downloads · 70 likes
A version of the Chinese RoBERTa-wwm-ext-base model fine-tuned on several sentiment-analysis datasets; a classification sketch follows.
Tags: Text Classification, Transformers, Chinese

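A classification sketch for the sentiment model above. The repo id follows IDEA-CCNL's published naming (`IDEA-CCNL/Erlangshen-Roberta-110M-Sentiment`); the exact label strings returned depend on the checkpoint's config.

```python
# Hedged sketch: Chinese sentiment classification via the pipeline.
from transformers import pipeline

clf = pipeline(
    "text-classification", model="IDEA-CCNL/Erlangshen-Roberta-110M-Sentiment"
)
print(clf("这部电影太棒了"))  # label names come from the checkpoint config
print(clf("今天心情很差"))
```
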
**Distilroberta Base 1** (uhlenbeckmew) · Apache-2.0 · 56 downloads · 0 likes
A fine-tuned version of distilroberta-base intended for text tasks; the card gives no further detail.
Tags: Large Language Model, Transformers

**Bert Base Uncased Wiki Scouting** (amanm27) · Apache-2.0 · 38 downloads · 0 likes
A model fine-tuned from bert-base-uncased-wiki; its target task is not specified.
Tags: Large Language Model, Transformers

**Qa Roberta Base Chinese Extractive** (liam168) · 34 downloads · 9 likes
A RoBERTa-base QA model fine-tuned on Chinese corpora, suitable for extractive QA tasks.
Tags: Question Answering System, Transformers, Chinese

**Ltrc Roberta** (ltrctelugu) · 52 downloads · 0 likes
A RoBERTa model trained on 8.8 million Telugu sentences, optimized for Telugu natural language processing tasks.
Tags: Large Language Model, Transformers

**Chinese Pert Base** (hfl) · 131 downloads · 13 likes
PERT is a Chinese pre-trained model based on BERT that focuses on improving Chinese text-processing capability.
Tags: Large Language Model, Transformers, Chinese

**Roberta Base** (klue) · 1.2M downloads · 33 likes
A RoBERTa model pre-trained on Korean, suitable for a variety of Korean NLP tasks; an encoding sketch follows.
Tags: Large Language Model, Transformers, Korean

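`klue/roberta-base` is a plain encoder, so one common pattern is to mean-pool its hidden states into sentence vectors; the pooling choice here is an illustration, not something the model card prescribes.

```python
# Hedged sketch: Korean sentence embeddings by mean-pooling RoBERTa outputs.
import torch
from transformers import AutoModel, AutoTokenizer

tok = AutoTokenizer.from_pretrained("klue/roberta-base")
model = AutoModel.from_pretrained("klue/roberta-base")

inputs = tok("한국어 문장 임베딩 예시입니다.", return_tensors="pt")
with torch.no_grad():
    hidden = model(**inputs).last_hidden_state  # (1, seq_len, 768)
embedding = hidden.mean(dim=1)                  # crude (1, 768) sentence vector
print(embedding.shape)
```
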
**Bert Base Uncased Finetuned Docvqa** (tiennvcs) · Apache-2.0 · 60 downloads · 1 like
A BERT-based model fine-tuned for Document Visual Question Answering (DocVQA) tasks.
Tags: Question Answering System, Transformers

**Chinese Bert Wwm** (hfl) · Apache-2.0 · 28.52k downloads · 79 likes
A Chinese pre-trained BERT model using the whole-word-masking strategy, designed to accelerate Chinese natural language processing research.
Tags: Large Language Model, Chinese

**Mt5 Base Chinese Qg** (algolet) · 103 downloads · 17 likes
A Chinese question-generation model based on the MT5 architecture that automatically generates questions from given Chinese text; a usage sketch follows.
Tags: Question Answering System, Transformers

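A lower-level seq2seq sketch for the question-generation model above, calling `generate()` directly instead of going through a pipeline. The repo id `algolet/mt5-base-chinese-qg` follows the card; whether the checkpoint expects a task prefix on the input is not confirmed here.

```python
# Hedged sketch: Chinese question generation with an MT5 fine-tune.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

tok = AutoTokenizer.from_pretrained("algolet/mt5-base-chinese-qg")
model = AutoModelForSeq2SeqLM.from_pretrained("algolet/mt5-base-chinese-qg")

text = "小明和朋友们周末一起去公园踢足球。"
inputs = tok(text, return_tensors="pt", truncation=True, max_length=512)
out = model.generate(**inputs, max_length=64, num_beams=4)
print(tok.decode(out[0], skip_special_tokens=True))
```
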
**Distilbert Base Uncased Squad2 With Ner With Neg With Repeat** (andi611) · 20 downloads · 0 likes
A question-answering and named-entity-recognition model fine-tuned from distilbert-base-uncased-squad2 on the CoNLL-2003 dataset.
Tags: Question Answering System, Transformers

**Chinese Xlnet Base** (hfl) · Apache-2.0 · 1,149 downloads · 31 likes
An XLNet model pre-trained for Chinese, aimed at enriching Chinese NLP resources and broadening the choice of Chinese pre-trained models.
Tags: Large Language Model, Transformers, Chinese

**Albert Base V2 Finetuned Squad** (knlu1016) · Apache-2.0 · 16 downloads · 0 likes
A version of albert-base-v2 fine-tuned on question-answering datasets, suitable for QA tasks.
Tags: Question Answering System, Transformers

**Bert Base Uncased Finetuned QnA V1** (mujerry) · Apache-2.0 · 23 downloads · 0 likes
A version of bert-base-uncased fine-tuned for QA tasks, suitable for English question answering.
Tags: Question Answering System, Transformers

**Distilbert Base Uncased** (distilbert) · Apache-2.0 · 11.1M downloads · 669 likes
DistilBERT is a distilled version of the BERT base model that keeps most of its performance while being lighter and faster, suitable for NLP tasks such as sequence classification and token classification; a loading sketch follows.
Tags: Large Language Model, English

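Because DistilBERT is a general encoder, the same checkpoint can be loaded under different task heads; the sketch below shows the two heads named in the card. Both heads are freshly initialized and must be fine-tuned before they produce meaningful predictions.

```python
# Hedged sketch: attaching task heads to distilbert-base-uncased.
from transformers import (
    AutoModelForSequenceClassification,
    AutoModelForTokenClassification,
)

# Sequence classification head (e.g. sentiment), 2 labels.
clf = AutoModelForSequenceClassification.from_pretrained(
    "distilbert-base-uncased", num_labels=2
)
# Token classification head (e.g. NER), 9 labels as in CoNLL-2003.
ner = AutoModelForTokenClassification.from_pretrained(
    "distilbert-base-uncased", num_labels=9
)
print(type(clf).__name__, type(ner).__name__)
```
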
**Distilbert Base Pt Cased** (Geotrend) · Apache-2.0 · 46 downloads · 2 likes
A compact version of distilbert-base-multilingual-cased tailored to Portuguese that preserves the accuracy of the original model.
Tags: Large Language Model, Transformers, Other

**Est Roberta** (EMBEDDIA) · 155 downloads · 4 likes
Est-RoBERTa is a monolingual Estonian model based on the RoBERTa architecture, trained on 2.51 billion Estonian tokens.
Tags: Large Language Model, Transformers, Other

**Bertjewdialdata** (Jeska) · 20 downloads · 0 likes
A Dutch BERT model fine-tuned from GroNLP/bert-base-dutch-cased.
Tags: Large Language Model, Transformers